Similar resources
Perspectives on Self-Scaling Variable Metric Algorithms
Recent attempts to assess the performance of SSVM algorithms for unconstrained minimization problems differ in their evaluations from earlier assessments. Nevertheless, the new experiments confirm earlier observations that, on certain types of problems, the SSVM algorithms are far superior to other variable metric methods. This paper presents a critical review of these recent assessments and di...
Self-Scaling Variable Metric Algorithms Without Line Search for Unconstrained Minimization
This paper introduces a new class of quasi-Newton algorithms for unconstrained minimization in which no line search is necessary and the inverse Hessian approximations are positive definite. These algorithms are based on a two-parameter family of rank-two updating formulae used earlier with line search in self-scaling variable metric algorithms. It is proved that, in a quadratic case, the new ...
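As a rough illustration of the kind of rank-two, two-parameter update this abstract refers to, a Broyden-family-style inverse-Hessian update can be sketched as follows; the function name and its parameter theta are illustrative assumptions, not the formula from the paper.

```python
import numpy as np

def rank_two_update(H, s, y, theta=1.0):
    """Illustrative Broyden-family-style rank-two update of an inverse-Hessian
    approximation H, given the step s = x_new - x_old and the gradient change
    y = g_new - g_old.  theta is the family parameter (theta = 0 gives a
    DFP-like update, theta = 1 a BFGS-like one).  Sketch only, not the
    paper's own two-parameter formula."""
    sy = s @ y                       # curvature of f along the step
    Hy = H @ y
    yHy = y @ Hy
    v = np.sqrt(yHy) * (s / sy - Hy / yHy)
    return (H
            - np.outer(Hy, Hy) / yHy
            + np.outer(s, s) / sy
            + theta * np.outer(v, v))
```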
Optimal conditioning of self-scaling variable metric algorithms
Variable Metric Methods are "Newton-Raphson-like" algorithms for unconstrained minimization in which the inverse Hessian is replaced by an approximation, inferred from previous gradients and updated at each iteration. During the past decade, various approaches have been used to derive general classes of such algorithms having the common properties of being Conjugate Directions methods and having...
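To make the description above concrete, a minimal variable-metric loop (with a BFGS-style inverse-Hessian update and a crude backtracking step; all names and tolerances below are illustrative, not taken from the cited paper) could look like this:

```python
import numpy as np

def variable_metric_minimize(f, grad, x0, tol=1e-8, max_iter=200):
    """Minimal quasi-Newton (variable metric) loop: the inverse Hessian is
    replaced by an approximation H, built from successive gradient changes
    and updated at every iteration.  Illustrative sketch only."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                        # initial inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                       # search direction defined by the metric H
        t = 1.0                          # crude backtracking (Armijo) line search
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
            t *= 0.5
        s = t * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        if s @ y > 1e-12:                # keep H positive definite
            rho = 1.0 / (s @ y)
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)   # BFGS-style inverse update
        x, g = x_new, g_new
    return x

# Usage example on a simple quadratic:
xmin = variable_metric_minimize(lambda x: x @ x, lambda x: 2 * x,
                                np.array([3.0, -4.0]))
```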
SELF-SCALING VARIABLE METRIC (SSVM) ALGORITHMS Part I: Criteria and Sufficient Conditions for Scaling a Class of Algorithms
A new criterion is introduced for comparing the convergence properties of variable metric algorithms, focusing on stepwise descent properties. This criterion is a bound on the rate of decrease in the function value at each iterative step (single-step convergence rate). Using this criterion as a basis for algorithm development leads to the introduction of variable coefficients to rescale the obj...
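The rescaling device mentioned here can be sketched, purely as an illustration of the generic self-scaling idea rather than the paper's own criterion, by scaling the inverse-Hessian approximation with a factor such as gamma = (s'y) / (y'Hy) before applying the rank-two update:

```python
import numpy as np

def self_scaled_bfgs_update(H, s, y):
    """One self-scaling variable metric step on the inverse-Hessian
    approximation H: rescale H by gamma, then apply a BFGS-style rank-two
    update.  gamma is a commonly used scaling choice, shown here only as an
    illustration; the paper derives its own criterion for the coefficient."""
    sy = s @ y
    gamma = sy / (y @ (H @ y))          # self-scaling factor (assumed choice)
    rho = 1.0 / sy
    V = np.eye(len(s)) - rho * np.outer(s, y)
    return V @ (gamma * H) @ V.T + rho * np.outer(s, s)
```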
Self-Scaling Variable Metric Algorithms Without Line Search for Unconstrained Minimization*
This paper introduces a new class of quasi-Newton algorithms for unconstrained minimization in which no line search is necessary and the inverse Hessian approximations are positive definite. These algorithms are based on a two-parameter family of rank-two updating formulae used earlier with line search in self-scaling variable metric algorithms. It is proved that, in a quadratic case, the new ...
Journal
Journal title: AL-Rafidain Journal of Computer Sciences and Mathematics
Year: 2007
ISSN: 2311-7990
DOI: 10.33899/csmj.2007.163992